Article: 12841 of comp.protocols.kermit.misc
Path: newsmaster.cc.columbia.edu!phl-feed.news.verio.net!iad-peer.news.verio.net!news.verio.net!news.maxwell.syr.edu!lon1-news.nildram.net!195.8.68.195.MISMATCH!newspeer.clara.net!news.clara.net!btnet-peer!btnet!news-feed1.eu.concert.net!att541!ip.att.net!newsmonger.rutgers.edu!news-nb.rutgers.edu!not-for-mail
From: sp2 admin <bsf@er6.rutgers.edu>
Newsgroups: comp.unix.aix,comp.protocols.kermit.misc
Subject: Re: ftp download with help of a file
Date: 5 Oct 2001 19:05:53 GMT
Organization: Rutgers University
Lines: 144
Sender: bsf@er6.rutgers.edu
Message-ID: <9pl0ah$so3$1@newsmonger.rutgers.edu>
References: <5e23bad3.0110050335.3c8e127@posting.google.com> <9pkgfq$ipf$1@newsmaster.cc.columbia.edu>
NNTP-Posting-Host: er6.rutgers.edu
X-Trace: newsmonger.rutgers.edu 1002308753 29443 165.230.180.134 (5 Oct 2001 19:05:53 GMT)
X-Complaints-To: news_support@email.rutgers.edu
NNTP-Posting-Date: 5 Oct 2001 19:05:53 GMT
User-Agent: tin/1.4.5-20010409 ("One More Nightmare") (UNIX) (SunOS/5.7 (sun4u))
Xref: newsmaster.cc.columbia.edu comp.unix.aix:223822 comp.protocols.kermit.misc:12841
Haven't tried it, but from the ftp man page:
o If a - (hyphen) is specified for the parameter, standard input (stdin) is
used for read operations and standard output (stdout) is used for write
operations.
As long as you can get your files into stdin, you should be able to do
it that way...
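A related way to script the stock ftp client entirely through stdin is to pipe it a prebuilt command conversation. A minimal sketch follows; the user, password, and remote path are placeholder values taken from later in this thread, not anything the man page prescribes:

```shell
# Build an ftp command stream from a list of filenames (one per line on
# stdin) and hand it to the client on its own stdin.  The login and
# remote path below are hypothetical placeholders.
build_ftp_commands() {
  printf 'user myname secret\n'
  printf 'cd blah/blah/somepath\n'
  while IFS= read -r f; do
    [ -n "$f" ] && printf 'get %s\n' "$f"
  done
  printf 'bye\n'
}

# Usage (not run here; needs a reachable server):
#   build_ftp_commands < mylist | ftp -n foo.bar.com
```

The `-n` flag suppresses auto-login so the `user` line in the stream takes effect.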
In comp.unix.aix Frank da Cruz <fdc@watsun.cc.columbia.edu> wrote:
> In article <5e23bad3.0110050335.3c8e127@posting.google.com>,
> antonio <antonio.napoleone@bedag.ch> wrote:
> : We download DAT files for our virus engines via FTP. Usually we do this
> : automatically, but for a while now we have had problems getting several
> : files in the same remote directory with the mget command. In some
> : directories the downloads work fine, but in others they do not. Now I am
> : trying to download file by file, but I'm not very experienced at this.
> : What I do is get the FILELIST from the remote server and extract the
> : filenames with grep and awk. Now my list looks like this:
> :
> : AVH32DLL.DL_
> : VIRSIG.DA_
> : VIRINFO.DA_
> : README.TX_
> :
> : Is it possible to connect to the FTP host, get each filename from the
> : list, and download it, closing the connection only after I have
> : downloaded every file? Or do I have to connect, get one file, close,
> : and so on?
> :
> This would require a bit more flexibility than you'll find in the regular
> FTP client. C-Kermit 8.0:
> http://www.columbia.edu/kermit/ck80.html
> includes a new scriptable FTP client that will let you do this:
> http://www.columbia.edu/kermit/ftpclient.html
> A scripting tutorial is here:
> http://www.columbia.edu/kermit/ckscripts.html
> And an FTP-specific scripting tutorial is here:
> http://www.columbia.edu/kermit/ftpscripts.html
> And complete documentation is here:
> http://www.columbia.edu/kermit/ckermit3.html#x3
> Here is a script that gets the list of filenames:
> cd somelocaldirectory
> ftp open foo.bar.com /user:myname /password:secret
> if fail exit 1 Can't reach host
> if not \v(ftp_loggedin) exit 1 FTP login failed
> ftp cd blah/blah/somepath
> if fail exit 1 Directory change failed
> ftp get /namelist:mylist
> if fail exit 1 Can't get list of filenames
> ftp bye
> Obviously we don't recommend putting passwords in scripts; better methods
> are available. The method shown above was chosen for brevity. If you
> are using anonymous ftp, the command would be:
> ftp open foo.bar.com /anonymous
> Now you have the list of filenames in the local file called 'mylist'.
> At this point, you should consider what you want to do with it. One
> strategy, as you suggest, is to open a new FTP session for each file.
> This can be done as follows (still in Kermit):
> fopen /read \%c mylist
> if fail exit 1 Can't open file list
> while not \f_eof(\%c) {
>     fread /line \%c filename
>     if fail break
>     ftp open foo.bar.com /user:myname /password:secret
>     if fail exit 1 Can't reach host
>     if not \v(ftp_loggedin) exit 1 FTP login failed
>     ftp cd blah/blah/somepath
>     ftp get \m(filename)
>     if fail exit 1 \m(filename): Download failed
>     ftp bye
> }
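[Ed.: the per-file loop above could also be sketched with the ordinary ftp client in shell. The host, login, and remote path are the placeholder values from the script, and this has not been run against a real server:]

```shell
# One FTP session per file, using the stock ftp client driven through a
# here-document.  foo.bar.com, myname/secret, and the remote path are
# placeholder values, not real credentials.
fetch_each() {  # $1 = file with one remote filename per line
  while IFS= read -r f; do
    [ -n "$f" ] || continue
    ftp -n foo.bar.com <<EOF
user myname secret
cd blah/blah/somepath
get $f
bye
EOF
  done < "$1"
}

# Usage (needs a reachable server): fetch_each mylist
```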
> This is a sort of brute-force approach, and I'm not sure it does what you
> want anyway. What happens if a download fails on a particular file?
> Here is a more elegant solution:
> cd somelocaldirectory
> delete *
> ftp open foo.bar.com /user:myname /password:secret
> if fail exit 1 Can't reach host
> if not \v(ftp_loggedin) exit 1 FTP login failed
> ftp cd blah/blah/somepath
> if fail exit 1 Directory change failed
> while true {
>     ftp get /update *
>     if success break
> }
> ftp bye
> Here we clean out any old copies of the files, make the FTP connection to
> the server, cd to the desired server directory, and ask it to send us all
> the files in update mode. This means: if I already have a current copy of
> a file, don't bother to send it, but if I don't, then please do send it.
> If this succeeds, we're done. If it fails, we try again, automatically
> skipping the files that were sent previously, and so on until all the files
> have been sent.
> We can make this script both more robust and more efficient:
> cd somelocaldirectory
> delete *
> while true {
>     ftp open foo.bar.com /user:myname /password:secret
>     if fail exit 1 Can't reach host
>     if not \v(ftp_loggedin) exit 1 FTP login failed
>     ftp cd blah/blah/somepath
>     if fail exit 1 Directory change failed
>     while true {
>         ftp get /recover /update *
>         if success goto done
>         if not \v(ftp_connected) break
>     }
>     ftp bye
> }
> done:
>
> This allows for the case when the connection is lost. When this happens,
> the script automatically goes back and reestablishes the connection and
> restarts the download; if the connection is not lost, however, it does not
> needlessly break the connection and reestablish it. In case a long file was
> interrupted in the middle, the /RECOVER option makes the download resume
> from the point of failure.
> - Frank
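[Ed.: the outer/inner loop pattern in that last script generalizes beyond Kermit. A shell sketch of the same control flow, where connect, download_batch, and connection_alive are hypothetical stand-ins for "ftp open", "ftp get /recover /update *", and the \v(ftp_connected) test:]

```shell
# Reconnect only when a transfer step reports the connection is gone;
# otherwise keep retrying transfers on the existing connection.
# connect, download_batch, and connection_alive are placeholder hooks.
fetch_until_done() {
  while :; do                      # outer loop: (re)establish connection
    connect || { echo "Can't reach host" >&2; return 1; }
    while :; do                    # inner loop: retry transfers
      download_batch && return 0   # all files arrived: done
      connection_alive || break    # connection lost: go reconnect
    done
  done
}
```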